A Gauss-Seidel Iterative Thresholding Algorithm for lq Regularized Least Squares Regression
Authors
Abstract
In recent studies on sparse modeling, lq (0 < q < 1) regularized least squares regression (lqLS) has received considerable attention due to its advantages in inducing sparsity and reducing bias over its convex counterparts. In this paper, we propose a Gauss-Seidel iterative thresholding algorithm (called GAITA) for solving this problem. Unlike the classical iterative thresholding algorithms, which use the Jacobi updating rule, GAITA updates the coordinate coefficients with the Gauss-Seidel rule. Under a mild condition, we show that the support set and sign of any sequence generated by GAITA converge within finitely many iterations. Together with the Kurdyka-Łojasiewicz property of (lqLS), this convergence property yields the strong convergence of GAITA under the same condition, which is generally weaker than the conditions required for the convergence of the classical iterative thresholding algorithms. Furthermore, we show that GAITA converges to a local minimizer under certain additional conditions. A set of numerical experiments demonstrates the effectiveness of GAITA and, in particular, its much faster convergence compared with the classical iterative thresholding algorithms.
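As a rough illustration of the kind of coordinate-wise (Gauss-Seidel) thresholding update the abstract describes, the following is a minimal sketch, not the authors' exact GAITA iteration: it cycles through the coordinates of the lq-regularized least-squares objective and applies a scalar lq thresholding step to each one in turn. The function names, the grid-based numerical scalar prox, and the fixed sweep count are assumptions made for the example.

```python
import numpy as np

def scalar_lq_prox(t, lam, q, grid_size=2000):
    """Numerically solve min_z 0.5*(z - t)**2 + lam*|z|**q.

    The minimizer has the same sign as t and magnitude at most |t|,
    so a dense grid over [0, |t|] (including the candidate z = 0)
    suffices for a sketch; closed-form thresholding formulas exist
    for special values of q."""
    if t == 0.0:
        return 0.0
    z = np.linspace(0.0, abs(t), grid_size)
    obj = 0.5 * (z - abs(t)) ** 2 + lam * z ** q
    return np.sign(t) * z[np.argmin(obj)]

def gauss_seidel_lq_ls(A, y, lam, q=0.5, n_sweeps=100):
    """Cyclic coordinate-wise iterative thresholding (a GAITA-style sketch) for
    min_x 0.5*||A x - y||^2 + lam * sum_i |x_i|**q."""
    m, n = A.shape
    x = np.zeros(n)
    r = y - A @ x                      # running residual
    col_sq = (A ** 2).sum(axis=0)      # per-coordinate curvature ||A_j||^2
    for _ in range(n_sweeps):
        for j in range(n):
            if col_sq[j] == 0:
                continue
            r += A[:, j] * x[j]        # remove coordinate j from the residual
            t = A[:, j] @ r / col_sq[j]
            # exact 1-D minimization in coordinate j with the others fixed
            x[j] = scalar_lq_prox(t, lam / col_sq[j], q)
            r -= A[:, j] * x[j]        # put the updated coordinate back
        # (a stopping rule on successive iterates could be added here)
    return x
```

Updating the residual in place after every coordinate is what makes the sweep Gauss-Seidel rather than Jacobi: each coordinate sees the most recent values of all the others.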
Similar resources
Efficient ℓq Minimization Algorithms for Compressive Sensing Based on Proximity Operator
This paper considers solving the unconstrained ℓq-norm (0 ≤ q < 1) regularized least squares (ℓq-LS) problem for recovering sparse signals in compressive sensing. We propose two highly efficient first-order algorithms by incorporating the proximity operator for nonconvex ℓq-norm functions into the fast iterative shrinkage/thresholding algorithm (FISTA) and the alternating direction method of multipliers...
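For context on the proximity-operator approach mentioned in this abstract, here is a minimal sketch of the plain proximal-gradient (ISTA-style) backbone on which FISTA builds by adding a momentum step. The component-wise ℓq proximity operator is passed in as a black box (its signature is an assumption), and neither the accelerated variant nor the ADMM variant of the cited paper is reproduced.

```python
import numpy as np

def ista_lq(A, y, lam, q, prox, n_iter=500):
    """Plain proximal-gradient iteration for
    min_x 0.5*||A x - y||^2 + lam * sum_i |x_i|**q,
    where `prox(v, tau, q)` is assumed to apply the scalar lq proximity
    operator component-wise to the vector v."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)           # gradient of the smooth part
        x = prox(x - grad / L, lam / L, q) # thresholding / proximal step
    return x
```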
Gauss-Seidel Iterative Methods for Rank Deficient Least Squares Problems
We study the semiconvergence of Gauss-Seidel iterative methods for the minimal-norm least squares solution of rank-deficient linear systems of equations. Necessary and sufficient conditions for the semiconvergence of the Gauss-Seidel iterative method are given. We also show that if the linear system of equations is consistent, then the proposed methods with a zero vector as an initial guess ...
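For reference, below is a bare-bones sketch of Gauss-Seidel sweeps applied to the normal equations A^T A x = A^T b, the setting in which semiconvergence for rank-deficient least-squares problems is typically discussed. This is a generic textbook iteration, not the specific methods analyzed in the cited paper, and the zero-diagonal skip is an ad hoc choice for the example.

```python
import numpy as np

def gauss_seidel_normal_equations(A, b, n_sweeps=200, x0=None):
    """Gauss-Seidel sweeps on the normal equations A^T A x = A^T b."""
    M = A.T @ A
    c = A.T @ b
    n = M.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(n_sweeps):
        for i in range(n):
            if M[i, i] == 0:          # rank-deficient case: zero diagonal entry
                continue
            # update x_i using the most recent values of the other coordinates
            x[i] = (c[i] - M[i, :] @ x + M[i, i] * x[i]) / M[i, i]
    return x
```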
Gauss-Seidel Estimation of Generalized Linear Mixed Models with Application to Poisson Modeling of Spatially Varying Disease Rates
Generalized linear mixed models (GLMMs) are often fit by computational procedures such as penalized quasi-likelihood (PQL). Special cases of GLMMs are generalized linear models (GLMs), which are often fit using algorithms like iterative weighted least squares (IWLS). High computational costs and memory space constraints make it difficult to apply these iterative procedures to data sets having a...
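To make the IWLS step referred to in this abstract concrete, here is a textbook sketch of iterative weighted least squares for a Poisson GLM with log link. It is not the Gauss-Seidel GLMM estimation procedure of the cited paper, and the convergence tolerance and iteration cap are arbitrary choices for the example.

```python
import numpy as np

def iwls_poisson(X, y, n_iter=25, tol=1e-8):
    """Iterative weighted least squares for a Poisson GLM with log link."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        w = mu                          # Poisson/log-link working weights
        z = eta + (y - mu) / mu         # working response
        # weighted least-squares update: solve (X^T W X) beta = X^T W z
        WX = X * w[:, None]
        beta_new = np.linalg.solve(X.T @ WX, WX.T @ z)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```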
Improved fast Gauss transform User manual
In most kernel-based machine learning algorithms and non-parametric statistics the key computational task is to compute a linear combination of local kernel functions centered on the training data, i.e., f(x) = Σ_{i=1}^{N} q_i k(x, x_i), which is the discrete Gauss transform for the Gaussian kernel. f is the regression/classification function in case of regularized least squares, Gaussian process regre...
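The sum defined in this abstract can be evaluated directly as sketched below; the common bandwidth convention exp(-||x - y||^2 / h^2) for the Gaussian kernel k is an assumption here, and the improved fast Gauss transform itself, which approximates this O(N·M) sum much faster, is not reproduced.

```python
import numpy as np

def discrete_gauss_transform(targets, sources, q, h):
    """Direct evaluation of f(x_j) = sum_i q_i * exp(-||x_j - y_i||^2 / h^2)
    for targets of shape (M, d), sources of shape (N, d), weights q of shape (N,)."""
    # pairwise squared distances between targets and sources
    d2 = ((targets[:, None, :] - sources[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / h ** 2) @ q
```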
A variable fixing version of the two-block nonlinear constrained Gauss-Seidel algorithm for ℓ1-regularized least-squares
The problem of finding sparse solutions to underdetermined systems of linear equations is very common in many fields, e.g., signal/image processing and statistics. A standard tool for dealing with sparse recovery is the ℓ1-regularized least-squares approach, which has recently attracted the attention of many researchers. In this paper, we describe a new version of the two-block nonlinear cons...
Journal: CoRR
Volume: abs/1507.03173
Issue: -
Pages: -
Publication date: 2015